Limited memory restarted ℓp-ℓq minimization methods using generalized Krylov subspaces

Authors

Abstract

Regularization of certain linear discrete ill-posed problems, as well as of certain regression problems, can be formulated as large-scale, possibly nonconvex, minimization problems whose objective function is the sum of the p-th power of the ℓp-norm of a fidelity term and the q-th power of the ℓq-norm of a regularization term, with 0 < p, q ≤ 2. We describe new restarted iterative solution methods that require less computer storage and execution time than the methods described by Huang et al. (BIT Numer. Math. 57, 351–378, 2017). The reduction in storage and execution time is achieved by periodic restarts of the method. Computed examples illustrate that restarting does not reduce the quality of the computed solutions.
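For reference, the minimization problem described here has the standard ℓp-ℓq form used in this literature; the symbols A (fidelity matrix), b (data), L (regularization matrix), and μ (regularization parameter) are the customary names and are not quoted from the paper itself:

\[
\min_{x\in\mathbb{R}^n} \; \frac{1}{p}\,\|Ax-b\|_p^p \;+\; \frac{\mu}{q}\,\|Lx\|_q^q,
\qquad 0 < p,\, q \le 2,
\qquad \|z\|_s^s := \sum_i |z_i|^s .
\]

For p or q below 1 the corresponding term is not a norm and the objective is nonconvex, which is why the abstract allows for possibly nonconvex problems.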


Similar articles

Majorization-minimization generalized Krylov subspace methods for lp-lq optimization applied to image restoration

A new majorization-minimization framework for lp-lq image restoration is presented. The solution is sought in a generalized Krylov subspace that is built up during the solution process. Proof of convergence to a stationary point of the minimized lp-lq functional is provided for both convex and nonconvex problems. Computed examples illustrate that high-quality restorations can be determined with...
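A minimal sketch of the majorization-minimization (IRLS-style) idea behind this framework, assuming smoothed reweighting of both terms and solving each quadratic majorant in the full space; the method in the article instead restricts the iterates to a generalized Krylov subspace that grows during the iterations. All names (A, b, L, mu, p, q, eps) are illustrative, not taken from the paper.

import numpy as np

def mm_lp_lq(A, b, L, mu, p, q, eps=1e-4, iters=30):
    """MM/IRLS sketch for min_x (1/p)||Ax-b||_p^p + (mu/q)||Lx||_q^q, 0 < p, q <= 2."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        r = A @ x - b                                # fidelity residual
        s = L @ x                                    # regularization term
        # Quadratic tangent majorant: smoothed reweighting turns each
        # term into a weighted least-squares contribution.
        wr = (r**2 + eps**2) ** (p / 2 - 1)
        ws = (s**2 + eps**2) ** (q / 2 - 1)
        M = A.T @ (wr[:, None] * A) + mu * (L.T @ (ws[:, None] * L))
        rhs = A.T @ (wr * b)
        x = np.linalg.solve(M, rhs)                  # minimizer of the majorant
        # The article's method solves this subproblem projected onto a
        # generalized Krylov subspace instead of in the full space.
    return x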


Convergence of Restarted Krylov Subspaces to Invariant Subspaces

The performance of Krylov subspace eigenvalue algorithms for large matrices can be measured by the angle between a desired invariant subspace and the Krylov subspace. We develop general bounds for this convergence that include the effects of polynomial restarting and impose no restrictions concerning the diagonalizability of the matrix or its degree of nonnormality. Associated with a desired se...
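A small sketch, assuming a symmetric test matrix, of the quantity such bounds control: the largest principal angle between a Krylov subspace and a desired invariant subspace (here the one spanned by the eigenvectors of the k largest eigenvalues). Restarting is not shown; the snippet only illustrates how the angle is measured.

import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(0)
n, k, m = 200, 3, 25
A = rng.standard_normal((n, n)); A = (A + A.T) / 2   # symmetric test matrix

# Desired invariant subspace: eigenvectors of the k largest eigenvalues.
w, V = np.linalg.eigh(A)
U = V[:, -k:]

# Krylov subspace K_m(A, v) built by Gram-Schmidt orthogonalization.
v = rng.standard_normal(n)
Q = np.empty((n, m))
Q[:, 0] = v / np.linalg.norm(v)
for j in range(1, m):
    w_vec = A @ Q[:, j - 1]
    w_vec -= Q[:, :j] @ (Q[:, :j].T @ w_vec)         # orthogonalize against previous basis
    Q[:, j] = w_vec / np.linalg.norm(w_vec)

# Largest principal angle between the invariant and Krylov subspaces.
print(subspace_angles(U, Q).max())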


Iterative Methods Based on Krylov Subspaces

with a symmetric and positive definite operator A defined on a Hilbert space V with dim V = N, and its preconditioned version PCG. We derive CG from the projection in the A-inner product point of view. We shall use (·, ·) for the standard inner product and (·, ·)_A for the inner product introduced by the SPD operator A. When we say 'orthogonal' vectors, we refer to the default (·, ·) inner product...
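A compact sketch of the conjugate gradient method seen from this viewpoint: the iterates minimize the A-norm of the error over a growing Krylov subspace, and the search directions are mutually A-orthogonal (conjugate). Only the unpreconditioned case is shown; PCG would additionally apply a preconditioner to the residual.

import numpy as np

def cg(A, b, x0=None, tol=1e-10, maxit=1000):
    """Conjugate gradients for Ax = b with A symmetric positive definite.
    Successive directions p are A-orthogonal: (p_i, A p_j) = 0 for i != j."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x                    # residual = negative gradient of 1/2 x'Ax - b'x
    p = r.copy()
    rs = r @ r
    for _ in range(maxit):
        Ap = A @ p
        alpha = rs / (p @ Ap)        # exact line search along p in the A-inner product
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p    # new direction, A-conjugate to the previous ones
        rs = rs_new
    return x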


Restarted Generalized Second-Order Krylov Subspace Methods for Solving Quadratic Eigenvalue Problems

This article is devoted to the numerical solution of large-scale quadratic eigenvalue problems. Such problems arise in a wide variety of applications, such as the dynamic analysis of structural mechanical systems, acoustic systems, fluid mechanics, and signal processing. We first introduce a generalized second-order Krylov subspace based on a pair of square matrices and two initial vectors and ...
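A brief sketch of the problem class, assuming small dense matrices: the quadratic eigenvalue problem (λ²M + λC + K)x = 0 can be solved, for reference purposes, through a standard companion linearization; the methods of the article instead project onto a generalized second-order Krylov subspace to handle large-scale problems.

import numpy as np
from scipy.linalg import eig

def qep_linearize(M, C, K):
    """Solve (lam**2 * M + lam * C + K) x = 0 via first companion linearization."""
    n = M.shape[0]
    I = np.eye(n)
    Z = np.zeros((n, n))
    # Generalized eigenproblem A z = lam * B z with z = [x; lam*x].
    A = np.block([[Z, I], [-K, -C]])
    B = np.block([[I, Z], [Z, M]])
    lam, z = eig(A, B)
    return lam, z[:n, :]             # eigenvalues and the x-part of the eigenvectors

# Tiny example with assumed mass, damping, and stiffness matrices.
M = np.eye(2); C = 0.1 * np.eye(2); K = np.array([[2.0, -1.0], [-1.0, 2.0]])
lams, X = qep_linearize(M, C, K)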


Limited-memory projective variable metric methods for unconstrained minimization

A new family of limited-memory variable metric or quasi-Newton methods for unconstrained minimization is given. The methods are based on a positive definite inverse Hessian approximation in the form of the sum of the identity matrix and two low-rank matrices, obtained by the standard scaled Broyden class update. To reduce the rank of the matrices, various projections are used. Numerical experience is e...
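For contrast, a sketch of the best-known limited-memory quasi-Newton scheme, the L-BFGS two-loop recursion (a related technique, not the projective variable metric family of the article): here too the inverse Hessian approximation is never formed explicitly, but kept implicitly as a scaled identity corrected by a small number of low-rank update pairs (s_i, y_i).

import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: apply the implicit inverse Hessian approximation
    built from stored pairs (s_i, y_i) = (x_{i+1}-x_i, g_{i+1}-g_i),
    returning the search direction -H*grad."""
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):   # newest pair first
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if s_list:                                             # initial scaling gamma*I
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):  # oldest pair first
        rho = 1.0 / (y @ s)
        b = rho * (y @ q)
        q += (a - b) * s
    return -q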



Journal

Journal title: Advances in Computational Mathematics

Year: 2023

ISSN: 1019-7168, 1572-9044

DOI: https://doi.org/10.1007/s10444-023-10020-8